Last Update: 7/13/2025
SenseFlow Text Completion API
The SenseFlow Text Completion API allows you to generate text content without maintaining conversation context. This is suitable for tasks like translation, article writing, and AI summarization.
Endpoints
Generate Completion
POST https://platform.llmprovider.ai/v1/agent/completion-messages
Request Headers
| Header | Value |
|---|---|
| Authorization | Bearer YOUR_API_KEY |
| Content-Type | application/json |
Request Body
| Parameter | Type | Description |
|---|---|---|
| model | string | Name of the model/agent to use (matches the "model" field in the example requests below). |
| inputs | object | (Optional) Key-value pairs of variables defined in the app. |
| response_mode | string | Response mode: streaming (recommended) or blocking |
| user | string | Unique identifier for the end user |
| files | array | (Optional) Array of file objects for upload |
Files Object Structure
| Field | Type | Description |
|---|---|---|
| type | string | File type (currently only supports 'image') |
| transfer_method | string | remote_url or local_file |
| url | string | Image URL (when transfer_method is remote_url) |
| upload_file_id | string | File ID (when transfer_method is local_file) |
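For illustration, here is how a request body with a remote-image attachment might be assembled in Python. This is a sketch: field names follow the table above, while the URL, query, and user values are placeholders.

```python
# Sketch: request payload attaching an image by URL.
# Field names follow the Files Object Structure table; values are placeholders.
payload = {
    "model": "",
    "inputs": {"query": "Describe this image"},
    "response_mode": "blocking",
    "user": "abc-123",
    "files": [
        {
            "type": "image",                  # currently only 'image' is supported
            "transfer_method": "remote_url",  # or 'local_file' with upload_file_id
            "url": "https://example.com/photo.png",
        }
    ],
}
```

When transfer_method is local_file, replace the url field with upload_file_id and omit url.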
Example Request
```json
{
  "model": "",
  "inputs": {
    "query": "Hello, world!"
  },
  "response_mode": "streaming",
  "user": "abc-123"
}
```
Response
The response varies based on the response_mode:
- For blocking mode: returns a ChatCompletionResponse object
- For streaming mode: returns a stream of ChunkChatCompletionResponse objects
ChatCompletionResponse Structure
| Field | Type | Description |
|---|---|---|
| message_id | string | Unique message identifier |
| mode | string | Fixed as "chat" |
| answer | string | Complete response content |
| metadata | object | Metadata information |
| usage | object | Model usage information |
| created_at | integer | Message creation timestamp |
Streaming Response Events
Each streaming chunk starts with data: and chunks are separated by \n\n. Example:
```
data: {"event": "message", "task_id": "900bbd43-dc0b-4383-a372-aa6e6c414227", "message_id": "663c5084", "answer": "Hi", "created_at": 1705398420}\n\n
```
Different event types in the stream:
| Event Type | Description | Fields |
|---|---|---|
| message | Text chunk from LLM | task_id, message_id, answer, created_at |
| message_end | Message completion | task_id, message_id, metadata, usage, retriever_resources |
| tts_message | Speech synthesis chunk | task_id, message_id, audio (base64), created_at |
| tts_message_end | Speech synthesis completion | task_id, message_id, audio (empty), created_at |
| message_replace | Content moderation replacement | task_id, message_id, answer, created_at |
| error | Stream error event | task_id, message_id, status, code, message |
| ping | Keep-alive ping (every 10s) | - |
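The per-chunk framing above can be handled with a small parser. The following is a sketch, not an official client: the event payloads mirror the table, and the task/message identifiers are invented for illustration.

```python
import json

def parse_sse_chunk(chunk: str):
    """Parse one 'data: {...}' chunk from the stream into a dict.

    Chunks are prefixed with 'data: ' and separated by blank lines,
    as described above. Returns None for lines without a data prefix.
    """
    line = chunk.strip()
    if not line.startswith("data: "):
        return None
    return json.loads(line[len("data: "):])

# Example chunks mirroring the event table above (identifiers are placeholders)
events = [
    'data: {"event": "message", "task_id": "t1", "message_id": "m1", "answer": "Hi", "created_at": 1705398420}',
    'data: {"event": "message_end", "task_id": "t1", "message_id": "m1", "metadata": {}, "usage": {"total_tokens": 10}}',
]
parsed = [parse_sse_chunk(e) for e in events]

# Concatenate the "message" events to recover the full answer text
answer = "".join(p["answer"] for p in parsed if p and p["event"] == "message")
```

A real client would dispatch on the "event" field to handle tts_message, message_replace, error, and ping events as well.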
Example Responses
For blocking mode:
```json
{
  "id": "0b089b9a-24d9-48cc-94f8-762677276261",
  "answer": "Hello! How can I help you today?",
  "created_at": 1679586667
}
```
For streaming mode:
```
data: {"event": "message", "task_id": "5ad4cb98", "message_id": "a8bdc41c", "answer": "Hello!", "created_at": 1721205487}
data: {"event": "tts_message", "task_id": "3bf8a0bb", "message_id": "a8bdc41c", "audio": "base64_audio_data", "created_at": 1721205487}
data: {"event": "message_end", "task_id": "5ad4cb98", "message_id": "a8bdc41c", "metadata": {}, "usage": {"total_tokens": 10}}
data: {"event": "tts_message_end", "task_id": "3bf8a0bb", "message_id": "a8bdc41c", "audio": "", "created_at": 1721205487}
```
Example Request
- Shell
- Python
- Node.js
```shell
curl -X POST 'https://platform.llmprovider.ai/v1/agent/completion-messages' \
  --header "Authorization: Bearer $YOUR_API_KEY" \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "model": "",
    "inputs": {"query": "Hello, world!"},
    "response_mode": "streaming",
    "user": "abc-123"
  }'
```
```python
import requests

api_key = 'YOUR_API_KEY'
url = 'https://platform.llmprovider.ai/v1/agent/completion-messages'
headers = {
    'Authorization': f'Bearer {api_key}',
    'Content-Type': 'application/json'
}
data = {
    'model': '',
    'inputs': {'query': 'Hello, world!'},
    'response_mode': 'streaming',
    'user': 'abc-123'
}
# Stream the body, since response_mode is "streaming" (the response
# is a sequence of "data: {...}" chunks, not a single JSON object)
response = requests.post(url, headers=headers, json=data, stream=True)
for line in response.iter_lines():
    if line:
        print(line.decode('utf-8'))
```
```javascript
const axios = require('axios');

const apiKey = 'YOUR_API_KEY';
const url = 'https://platform.llmprovider.ai/v1/agent/completion-messages';
const data = {
  model: '',
  inputs: {query: 'Hello, world!'},
  response_mode: 'streaming',
  user: 'abc-123'
};
const headers = {
  'Authorization': `Bearer ${apiKey}`,
  'Content-Type': 'application/json'
};
// Read the body as a stream, since response_mode is "streaming"
axios.post(url, data, {headers, responseType: 'stream'})
  .then(response => response.data.pipe(process.stdout))
  .catch(error => console.error(error));
```
Stop Response
POST https://platform.llmprovider.ai/v1/agent/completion-messages/:task_id/stop
Stop a streaming response. Only available for streaming mode.
Request Headers
| Header | Value |
|---|---|
| Authorization | Bearer YOUR_API_KEY |
| Content-Type | application/json |
Path Parameters
| Parameter | Type | Description |
|---|---|---|
| task_id | string | Task ID obtained from the streaming response |
Request Body Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | Model/agent name (same value as in the original request) |
| user | string | Yes | User identifier (must match the message API user ID) |
Response
| Field | Type | Description |
|---|---|---|
| result | string | Operation result; "success" when the stream was stopped |
Example Response
```json
{
  "result": "success"
}
```
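In practice the task_id is not known in advance; it arrives in the streaming chunks. The sketch below shows how one might extract it from a chunk and build the stop URL. The chunk contents are placeholders taken from the streaming examples above.

```python
import json

BASE_URL = "https://platform.llmprovider.ai/v1/agent"

def stop_url(task_id: str) -> str:
    """Build the stop endpoint URL for a task_id taken from the stream."""
    return f"{BASE_URL}/completion-messages/{task_id}/stop"

def task_id_from_chunk(chunk: str) -> str:
    """Extract task_id from one 'data: {...}' streaming chunk."""
    return json.loads(chunk[len("data: "):])["task_id"]

# A chunk like the streaming examples above (values are placeholders)
chunk = 'data: {"event": "message", "task_id": "5ad4cb98", "message_id": "a8bdc41c", "answer": "Hello!", "created_at": 1721205487}'
tid = task_id_from_chunk(chunk)
url = stop_url(tid)
```

You would then POST to this URL with the same user value that was sent in the original completion request.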
Example Request
- Shell
- Python
- Node.js
```shell
curl -X POST 'https://platform.llmprovider.ai/v1/agent/completion-messages/task_123/stop' \
  --header "Authorization: Bearer $YOUR_API_KEY" \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "model": "",
    "user": "abc-123"
  }'
```
```python
import requests

api_key = 'YOUR_API_KEY'
url = 'https://platform.llmprovider.ai/v1/agent/completion-messages/task_123/stop'
headers = {
    'Authorization': f'Bearer {api_key}',
    'Content-Type': 'application/json'
}
data = {
    'model': '',
    'user': 'abc-123'
}
response = requests.post(url, headers=headers, json=data)
print(response.json())
```
```javascript
const axios = require('axios');

const apiKey = 'YOUR_API_KEY';
const url = 'https://platform.llmprovider.ai/v1/agent/completion-messages/task_123/stop';
const data = {
  model: '',
  user: 'abc-123'
};
const headers = {
  'Authorization': `Bearer ${apiKey}`,
  'Content-Type': 'application/json'
};
axios.post(url, data, {headers})
  .then(response => console.log(response.data))
  .catch(error => console.error(error));
```
Message Feedback
POST https://platform.llmprovider.ai/v1/agent/messages/:message_id/feedbacks
Submit user feedback (likes/dislikes) for messages to help developers optimize output.
Path Parameters
| Parameter | Type | Description |
|---|---|---|
| message_id | string | Message ID |
Request Headers
| Header | Value |
|---|---|
| Authorization | Bearer YOUR_API_KEY |
| Content-Type | application/json |
Request Body Parameters
| Parameter | Type | Description |
|---|---|---|
| model | string | Model/agent name |
| rating | string | Feedback type: "like", "dislike", or null to revoke previous feedback |
| user | string | User identifier |
| content | string | Optional feedback details |
Response
| Field | Type | Description |
|---|---|---|
| result | string | Operation result; "success" when the feedback was recorded |
Example Response
```json
{
  "result": "success"
}
```
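The two payload shapes can be sketched as follows. This assumes, per the rating parameter above, that a null rating revokes previously submitted feedback; the user and content values are placeholders.

```python
# Sketch: payload for submitting feedback on a message
like_payload = {
    "model": "",
    "rating": "like",        # or "dislike"
    "user": "abc-123",
    "content": "Helpful answer",  # optional details
}

# Sketch: sending rating=None (JSON null) to revoke earlier feedback
revoke_payload = {
    "model": "",
    "rating": None,
    "user": "abc-123",
}
```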
Example Request
- Shell
- Python
- Node.js
```shell
curl -X POST 'https://platform.llmprovider.ai/v1/agent/messages/msg_123/feedbacks' \
  --header "Authorization: Bearer $YOUR_API_KEY" \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "model": "",
    "rating": "like",
    "user": "abc-123",
    "content": "This response was helpful"
  }'
```
```python
import requests

api_key = 'YOUR_API_KEY'
url = 'https://platform.llmprovider.ai/v1/agent/messages/msg_123/feedbacks'
headers = {
    'Authorization': f'Bearer {api_key}',
    'Content-Type': 'application/json'
}
data = {
    'model': '',
    'rating': 'like',
    'user': 'abc-123',
    'content': 'This response was helpful'
}
response = requests.post(url, headers=headers, json=data)
print(response.json())
```
```javascript
const axios = require('axios');

const apiKey = 'YOUR_API_KEY';
const url = 'https://platform.llmprovider.ai/v1/agent/messages/msg_123/feedbacks';
const data = {
  model: '',
  rating: 'like',
  user: 'abc-123',
  content: 'This response was helpful'
};
const headers = {
  'Authorization': `Bearer ${apiKey}`,
  'Content-Type': 'application/json'
};
axios.post(url, data, {headers})
  .then(response => console.log(response.data))
  .catch(error => console.error(error));
```